Feature-based non-parametric estimation of Kullback–Leibler divergence for SAR image change detection
Authors
Abstract
In this article, a method based on non-parametric estimation of the Kullback–Leibler divergence in a local feature space is proposed for synthetic aperture radar (SAR) image change detection. First, local features based on a set of Gabor filters are extracted from both the pre- and post-event images. The distribution of these local features over a local neighbourhood is taken as a statistical representation of the local image information, and the Kullback–Leibler divergence is used as a probabilistic distance to measure the similarity of the two distributions. However, estimating the distribution of a high-dimensional random vector is not trivial, let alone comparing two such distributions. Thus, a non-parametric method based on k-nearest-neighbour search is proposed to compute the Kullback–Leibler divergence between the two distributions directly from the samples. Experiments comparing this method with other state-of-the-art methods demonstrate its effectiveness for SAR image change detection.
ARTICLE HISTORY: Received 8 May 2016; Accepted 29 June 2016
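For illustration, a minimal sketch of the two ingredients described above is given below in Python. The filter-bank parameters, function names, and the use of scipy/skimage are assumptions made for this sketch, not the authors' implementation; the divergence estimator follows the standard k-nearest-neighbour form of Wang, Kulkarni and Verdú (2009).

import numpy as np
from scipy.spatial import cKDTree
from skimage.filters import gabor

def gabor_features(image, frequencies=(0.1, 0.2, 0.4), n_orient=4):
    # Stack Gabor magnitude responses into a per-pixel feature vector.
    # The filter-bank parameters here are illustrative, not the paper's.
    feats = []
    for f in frequencies:
        for i in range(n_orient):
            real, imag = gabor(image, frequency=f, theta=i * np.pi / n_orient)
            feats.append(np.hypot(real, imag))
    return np.stack(feats, axis=-1)  # shape (H, W, D)

def knn_kl(x, y, k=1, eps=1e-12):
    # k-NN estimate of D_KL(P || Q) from samples x ~ P (n, d) and y ~ Q (m, d);
    # eps guards against log(0) when duplicate samples occur.
    n, d = x.shape
    m = y.shape[0]
    # k-th NN distance of each x_i among the other x samples (k+1 skips self)
    rho = cKDTree(x).query(x, k=k + 1)[0][..., -1]
    # k-th NN distance of each x_i among the y samples
    nu = cKDTree(y).query(x, k=k)[0]
    if k > 1:
        nu = nu[..., -1]
    return d * np.mean(np.log((nu + eps) / (rho + eps))) + np.log(m / (n - 1))

In a change-detection setting, the two sample sets would be the per-pixel Gabor feature vectors drawn from corresponding local windows of the pre- and post-event images; a symmetrized divergence knn_kl(a, b) + knn_kl(b, a), evaluated over a sliding window and thresholded, would then yield the change map.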
منابع مشابه
Change Detection for Earthquake-damaged and Reconstructed Urban Area on SAR Images
Considering the purpose of the change detection task, this paper first extracts the urban area from the filtered co-registered original image using a variogram-based method. It then fits the SAR images with the generalized Gamma model in order to obtain characteristic information, such as radiation value and local texture, based on the clutter statistics of SAR...
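For reference, the generalized Gamma model mentioned above is commonly written (in Stacy's parameterization; the parameter names here are generic, not necessarily those of the paper) as

$$ f(x) = \frac{\nu}{\sigma\,\Gamma(\kappa)} \left(\frac{x}{\sigma}\right)^{\kappa\nu - 1} \exp\!\left[-\left(\frac{x}{\sigma}\right)^{\nu}\right], \qquad x > 0, $$

where \(\sigma\) is a scale parameter and \(\kappa, \nu\) control the shape; it includes the Gamma (\(\nu = 1\)) and Weibull (\(\kappa = 1\)) distributions often used for SAR amplitude statistics as special cases.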
Alpha-Divergence for Classification, Indexing and Retrieval (Revised 2)
Motivated by Chernoff's bound on the asymptotic probability of error, we propose the alpha-divergence measure and a surrogate, the alpha-Jensen difference, for feature classification, indexing and retrieval in image and other databases. The alpha-divergence, also known as the Rényi divergence, is a generalization of the Kullback–Leibler divergence and the Hellinger affinity between the probability densities...
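As a reminder of the quantity involved, the alpha-divergence (Rényi divergence of order \(\alpha\)) between densities \(p\) and \(q\) is

$$ D_\alpha(p \,\|\, q) = \frac{1}{\alpha - 1} \log \int p(x)^{\alpha}\, q(x)^{1-\alpha}\, dx, $$

which recovers the Kullback–Leibler divergence as \(\alpha \to 1\), while \(\alpha = 1/2\) gives \(-2 \log\) of the Hellinger affinity \(\int \sqrt{p(x)\,q(x)}\, dx\).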
Alpha-Divergence for Classification, Indexing and Retrieval (Revised)
Motivated by Chernoff's bound on the asymptotic probability of error, we propose the alpha-divergence measure and a surrogate, the alpha-Jensen difference, for feature classification, indexing and retrieval in image and other databases. The alpha-divergence, also known as the Rényi divergence, is a generalization of the Kullback–Leibler divergence and the Hellinger affinity between the probability densities...
Model Confidence Set Based on Kullback–Leibler Divergence Distance
Consider the problem of estimating the true density h(·) based upon a random sample X1, …, Xn. In general, h(·) is approximated using an appropriate (in some sense; see below) model fθ(x). Using Vuong's (1989) test along with a collection of k (> 2) non-nested models, this article constructs a set of appropriate models, a so-called model confidence set, for the unknown model h(·). Application of such confidence...
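The Kullback–Leibler distance underlying this kind of construction is, for a candidate model fθ,

$$ D_{\mathrm{KL}}(h \,\|\, f_\theta) = \mathrm{E}_h[\log h(X)] - \mathrm{E}_h[\log f_\theta(X)], $$

so, since the first term does not depend on \(\theta\), candidate models can be ranked by the sample average \(n^{-1} \sum_i \log f_\theta(X_i)\); Vuong's test then compares these averaged log-likelihoods pairwise across non-nested models.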
Alpha-Divergence for Classification, Indexing and Retrieval
Motivated by Chernoff's bound on the asymptotic probability of error, we propose the alpha-divergence measure and a surrogate, the alpha-Jensen difference, for feature classification, indexing and retrieval in image and other databases. The alpha-divergence, also known as the Rényi divergence, is a generalization of the Kullback–Leibler divergence and the Hellinger affinity between the probability densities...